Iterative minimization of the Rayleigh quotient by block steepest descent iterations

Authors

  • Klaus Neymeyr
  • Ming Zhou
Abstract

The topic of this paper is the convergence analysis of subspace gradient iterations for the simultaneous computation of a few of the smallest eigenvalues, together with the associated eigenvectors, of a symmetric and positive definite matrix pair (A, M). The methods are based on subspace iterations for A^{-1}M and use the Rayleigh-Ritz procedure for convergence acceleration. New sharp convergence estimates are proved by generalizing estimates which have been presented for vectorial steepest descent iterations (see SIAM J. Matrix Anal. Appl., 32(2):443-456, 2011).
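
As a rough illustration of such a subspace (block) gradient iteration with Rayleigh-Ritz acceleration, the following is a minimal sketch; it is not the authors' implementation, it assumes dense NumPy/SciPy matrices, and all names (block_steepest_descent, num_iters) are illustrative.

```python
import numpy as np
from scipy.linalg import eigh

def block_steepest_descent(A, M, X, num_iters=50):
    # Rayleigh-Ritz on span(X): afterwards X is M-orthonormal and
    # X^T A X = diag(theta) holds for the Ritz values theta.
    theta, C = eigh(X.T @ A @ X, X.T @ M @ X)
    X = X @ C
    for _ in range(num_iters):
        # Residual block R = A X - M X diag(theta); column j is parallel
        # to the gradient of the Rayleigh quotient at the j-th Ritz vector.
        R = A @ X - (M @ X) * theta
        # Augment the subspace with the gradients and repeat Rayleigh-Ritz.
        # (A robust code would orthogonalize V first; near convergence
        # V can become rank-deficient.)
        V = np.hstack([X, R])
        w, C = eigh(V.T @ A @ V, V.T @ M @ V)
        # Keep the Ritz vectors belonging to the smallest Ritz values.
        theta, X = w[: X.shape[1]], V @ C[:, : X.shape[1]]
    return theta, X
```

With symmetric positive definite A and M and a random n-by-k start block X, the returned theta approximates the k smallest eigenvalues of (A, M).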


Similar articles

Convergence Analysis of Restarted Krylov Subspace Eigensolvers

The A-gradient minimization of the Rayleigh quotient allows one to construct robust and fast-convergent eigensolvers for the generalized eigenvalue problem for (A, M) with symmetric and positive definite matrices. The A-gradient steepest descent iteration is the simplest instance of more general restarted Krylov subspace iterations, namely the special case that all step-wise generated Krylov subspaces are tw...
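
For orientation, the objects mentioned here can be written out explicitly; these are the standard definitions (our notation, not quoted from the cited paper):

\[
\rho(x) = \frac{x^T A x}{x^T M x}, \qquad
\nabla \rho(x) = \frac{2}{x^T M x}\left(Ax - \rho(x)\,Mx\right), \qquad
\nabla_A \rho(x) = A^{-1}\,\nabla \rho(x).
\]

Up to scaling, the A-gradient at x is a linear combination of x and A^{-1}Mx, so an A-gradient step stays in a Krylov subspace of A^{-1}M; this is the link to restarted Krylov subspace iterations.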


Convergence Analysis of Gradient Iterations for the Symmetric Eigenvalue Problem

Gradient iterations for the Rayleigh quotient are simple and robust solvers to determine a few of the smallest eigenvalues together with the associated eigenvectors of (generalized) matrix eigenvalue problems for symmetric matrices. Sharp convergence estimates for the Ritz values and Ritz vectors are derived for various steepest descent/ascent gradient iterations. The analysis shows that poores...
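
In the vectorial case, one steepest descent step with exact line search amounts to a two-dimensional Rayleigh-Ritz step in the space spanned by the iterate and its residual. A minimal sketch under the same assumptions as above (illustrative names, dense NumPy/SciPy matrices):

```python
import numpy as np
from scipy.linalg import eigh

def steepest_descent_step(A, M, x):
    # Rayleigh quotient and residual (a multiple of the gradient) at x.
    rho = (x @ A @ x) / (x @ M @ x)
    r = A @ x - rho * (M @ x)
    # Exact line search: minimizing the Rayleigh quotient over span{x, r}
    # is a 2x2 generalized eigenvalue problem (Rayleigh-Ritz).
    # (A robust code would guard against r = 0 at convergence.)
    V = np.column_stack([x, r])
    w, C = eigh(V.T @ A @ V, V.T @ M @ V)
    x_new = V @ C[:, 0]  # Ritz vector for the smaller Ritz value
    return w[0], x_new / np.linalg.norm(x_new)
```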


Minimization of Tikhonov Functionals in Banach Spaces

Tikhonov functionals are known to be well suited for obtaining regularized solutions of linear operator equations. We analyze two iterative methods for finding the minimizer of norm-based Tikhonov functionals in Banach spaces. One is the steepest descent method, whereby the iterations are directly carried out in the underlying space, and the other one performs iterations in the dual space. We p...
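
As a point of reference, a typical norm-based Tikhonov functional for a linear operator F between Banach spaces X and Y has the form (a standard example with exponents p, q > 1 and regularization parameter \(\alpha > 0\); the paper's precise setting may differ):

\[
T_\alpha(x) = \frac{1}{p}\,\|Fx - y\|_Y^{\,p} + \frac{\alpha}{q}\,\|x\|_X^{\,q},
\]

and both methods mentioned above are descent iterations for such a functional, carried out either in X itself or in the dual space X^*.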


Residual norm steepest descent based iterative algorithms for Sylvester tensor equations

Consider the following consistent Sylvester tensor equation
\[
\mathscr{X}\times_1 A + \mathscr{X}\times_2 B + \mathscr{X}\times_3 C = \mathscr{D},
\]
where the matrices $A, B, C$ and the tensor $\mathscr{D}$ are given and $\mathscr{X}$ is the unknown tensor. The current paper is concerned with examining a simple and neat framework for accelerating the speed of convergence of the gradient-based iterative algorithm and ...
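
As a baseline for such gradient-based schemes, a plain residual-norm steepest descent iteration for this tensor equation can be sketched as follows (a minimal NumPy version with exact line search; the names and the 0-based mode convention are ours, and the accelerated framework of the paper is more elaborate):

```python
import numpy as np

def mode_product(X, A, mode):
    # Mode-n product: contracts the given mode of X with the columns of A
    # (mode 0 in code corresponds to x_1 in the 1-based math notation).
    return np.moveaxis(np.tensordot(A, X, axes=(1, mode)), 0, mode)

def sylvester_residual_descent(A, B, C, D, X, num_iters=100):
    # The linear operator L(X) = X x_1 A + X x_2 B + X x_3 C and its adjoint.
    op = lambda T: (mode_product(T, A, 0) + mode_product(T, B, 1)
                    + mode_product(T, C, 2))
    adj = lambda T: (mode_product(T, A.T, 0) + mode_product(T, B.T, 1)
                     + mode_product(T, C.T, 2))
    for _ in range(num_iters):
        R = D - op(X)    # residual of the tensor equation
        P = adj(R)       # steepest descent direction for 0.5 * ||R||^2
        LP = op(P)
        # Exact line search step (a robust code would guard LP == 0).
        alpha = np.sum(P * P) / np.sum(LP * LP)
        X = X + alpha * P
    return X
```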


The Block Preconditioned Steepest Descent Iteration for Elliptic Operator Eigenvalue Problems

The block preconditioned steepest descent iteration is an iterative eigensolver for subspace eigenvalue and eigenvector computations. An important area of application of the method is the approximate solution of mesh eigenproblems for self-adjoint and elliptic partial differential operators. The subspace iteration allows one to compute some of the smallest eigenvalues together with the associated i...
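
In the notation used above, one step of this iteration takes the block of Ritz vectors X with Ritz values \(\theta_1, \dots, \theta_k\), applies a preconditioner \(T \approx A^{-1}\) to the block of residuals, and then re-extracts Ritz approximations; schematically (our notation, a sketch of the standard form of the iteration):

\[
X' = X - T\left(AX - MX\,\Theta\right), \qquad \Theta = \operatorname{diag}(\theta_1, \dots, \theta_k),
\]

followed by the Rayleigh-Ritz procedure for (A, M) in the subspace spanned by the columns of X'.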



Journal:
  • Numerical Lin. Alg. with Applic.

Volume 21, Issue

Pages  -

Publication date 2014